Tensor rank-one decomposition of probability tables
Author
Abstract
We propose a new additive decomposition of probability tables – tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables such that the sum of the series equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in these tables are allowed to be any real number, i.e. they can also be negative. The possibility of negative entries, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that tensor rank-one decomposition can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm for cases in which the closed-form solution is not known.
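As a minimal illustration of the idea (the example below is ours, not taken from the paper), allowing negative entries already pays off for the 2×2×2 indicator table of the parity relation, which can be written as a sum of just two rank-one tables:

```python
import numpy as np

# Illustrative example (not from the paper): the table T[x, y, z] = 1 if
# x + y + z is even and 0 otherwise is the sum of two rank-one tables,
# provided negative entries are allowed in the one-dimensional factors.
a1, b1, c1 = np.array([1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])
a2, b2, c2 = np.array([1.0, -1.0]), np.array([1.0, -1.0]), np.array([0.5, -0.5])

# Each summand is an outer product of one-dimensional tables.
T = np.einsum('i,j,k->ijk', a1, b1, c1) + np.einsum('i,j,k->ijk', a2, b2, c2)

original = np.array([[[1., 0.], [0., 1.]],
                     [[0., 1.], [1., 0.]]])
assert np.allclose(T, original)
```

For a table over many variables the same scheme stores a few one-dimensional factors per summand instead of one exponentially large array, which is where the space and time savings in inference come from.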
Similar references
Exploiting tensor rank-one decomposition in probabilistic inference
We propose a new additive decomposition of probability tables – tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables, such that the table that is the sum of the series is equal to the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in table...
An Approximate Tensor-Based Inference Method Applied to the Game of Minesweeper
We propose an approximate probabilistic inference method based on the CP tensor decomposition and apply it to the well-known computer game of Minesweeper. In the method we view conditional probability tables of the exactly-l-out-of-k functions as tensors and approximate them by a sum of rank-one tensors. The number of summands is min{l + 1, k − l + 1}, which is lower than their exact symmetr...
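For a sense of what such a decomposition looks like, the sketch below uses a generic interpolation construction with k + 1 summands (more than the min{l + 1, k − l + 1} summands of the approximation referred to above; the construction is our illustration, not the paper's):

```python
import numpy as np
from functools import reduce
from itertools import product

# Build the 0/1 table of the "exactly l-out-of-k" function and write it as a
# sum of k + 1 rank-one tensors via interpolation at distinct points t_j.
k, l = 4, 2
points = np.arange(1.0, k + 2)                    # k + 1 distinct points t_j
V = np.vander(points, k + 1, increasing=True).T   # V[s, j] = t_j ** s
e = np.zeros(k + 1); e[l] = 1.0
w = np.linalg.solve(V, e)                         # sum_j w_j * t_j**s == [s == l]

def rank_one(vec, order):
    """Outer product of `order` copies of a one-dimensional table."""
    return reduce(np.multiply.outer, [vec] * order)

# Each summand is rank-one because t_j ** (x_1 + ... + x_k) factorizes over the x_i.
T = sum(w[j] * rank_one(np.array([1.0, points[j]]), k) for j in range(k + 1))

exact = np.zeros((2,) * k)
for x in product((0, 1), repeat=k):
    exact[x] = float(sum(x) == l)
assert np.allclose(T, exact)
```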
Triangulation Heuristics for BN2O Networks
A BN2O network is a Bayesian network having the structure of a bipartite graph with all edges directed from one part (the top level) toward the other (the bottom level) and where all conditional probability tables are noisy-or gates. In order to perform efficient inference, graphical transformations of these networks are performed. The efficiency of inference is proportional to the total table ...
Probabilistic inference with noisy-threshold models based on a CP tensor decomposition
The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease that difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. When using the standard inference techniques the inference complexity is exponen...
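As a hand-worked illustration of the kind of decomposition involved (assumed parameter values, not code from the referenced paper), the CPT of a plain noisy-or, the simplest noisy-threshold model, splits into two rank-one terms, one of which carries a negative entry, so only k + 1 short vectors are stored instead of a table with 2^(k+1) entries:

```python
import numpy as np
from functools import reduce

# Noisy-or CPT over (Y, X1, ..., Xk) with assumed inhibition probabilities q_i:
# P(Y = 0 | x) = prod_i q_i ** x_i and P(Y = 1 | x) = 1 - P(Y = 0 | x).
q = np.array([0.3, 0.2, 0.5])
k = len(q)

p_y0 = reduce(np.multiply.outer, [np.array([1.0, qi]) for qi in q])
cpt = np.stack([p_y0, 1.0 - p_y0])   # axis 0 indexes Y

# Two rank-one terms; note the negative entry in the Y-factor of the second.
term1 = reduce(np.multiply.outer, [np.array([0.0, 1.0])] + [np.ones(2)] * k)
term2 = reduce(np.multiply.outer, [np.array([1.0, -1.0])] + [np.array([1.0, qi]) for qi in q])
assert np.allclose(cpt, term1 + term2)
```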
Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition
Conditional probability tables (CPTs) of threshold functions represent a generalization of two popular models – noisy-or and noisy-and. They constitute an alternative to these two models in case they are too rough. When using the standard inference techniques the inference complexity is exponential with respect to the number of parents of a variable. In case the CPTs take a special form (in thi...
Publication year: 2005